Multivariate Dependence beyond Shannon Information


Similar References

Multivariate Dependence beyond Shannon Information

Accurately determining dependency structure is critical to discovering a system’s causal organization. We recently showed that the transfer entropy fails in a key aspect of this—measuring information flow—due to its conflation of dyadic and polyadic relationships. We extend this observation to demonstrate that this is true of all such Shannon information measures when used to analyze multivariate…
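The conflation described here can be seen in the standard XOR example: three binary variables with a purely triadic dependency that every pairwise Shannon mutual information misses entirely. A minimal sketch in plain Python (not taken from the paper's own code; function names are illustrative):

```python
from itertools import product
from math import log2

# Joint distribution of (X, Y, Z): X, Y fair coins, Z = X XOR Y.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def marginal(dist, axes):
    """Marginalize a joint distribution onto the given tuple of axes."""
    out = {}
    for outcome, p in dist.items():
        key = tuple(outcome[a] for a in axes)
        out[key] = out.get(key, 0.0) + p
    return out

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def mutual_info(dist, a, b):
    """I(A;B) = H(A) + H(B) - H(A,B), where a and b are tuples of axes."""
    return (entropy(marginal(dist, a)) + entropy(marginal(dist, b))
            - entropy(marginal(dist, a + b)))

# Every pairwise mutual information vanishes ...
print(mutual_info(joint, (0,), (1,)))    # I(X;Y) = 0.0
print(mutual_info(joint, (0,), (2,)))    # I(X;Z) = 0.0
# ... yet Z is fully determined by (X, Y) taken together:
print(mutual_info(joint, (0, 1), (2,)))  # I(X,Y;Z) = 1.0
```

A dyadic structure with the same pairwise statistics would be indistinguishable to these measures, which is the failure mode the abstract points at.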


Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions

The entropy and mutual information index are important concepts developed by Shannon in the context of information theory. They have been widely studied in the case of the multivariate normal distribution. We first extend these tools to the full symmetric class of multivariate elliptical distributions and then to the more flexible families of multivariate skew-elliptical distributions. We study...
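For the multivariate normal starting point mentioned here, both quantities have closed forms via the covariance determinant. A hedged NumPy sketch (function names are mine, not from the paper):

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (nats) of N(mu, cov): 0.5 * log((2*pi*e)^d * det(cov))."""
    d = cov.shape[0]
    return 0.5 * (d * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

def gaussian_mi(cov, idx_a, idx_b):
    """I(A;B) = H(A) + H(B) - H(A,B) for jointly Gaussian blocks of variables."""
    sub = lambda idx: cov[np.ix_(idx, idx)]
    return (gaussian_entropy(sub(idx_a)) + gaussian_entropy(sub(idx_b))
            - gaussian_entropy(sub(idx_a + idx_b)))

rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])
# Bivariate normal: I(X;Y) = -0.5 * ln(1 - rho^2), about 0.511 nats here.
print(gaussian_mi(cov, [0], [1]))
print(-0.5 * np.log(1 - rho**2))
```

The elliptical and skew-elliptical extensions the abstract describes modify these closed forms; the Gaussian case above is only the baseline they generalize.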


Unique Additive Information Measures: Boltzmann-Gibbs-Shannon, Fisher and Beyond

It is proved that the only additive and isotropic information measure that can depend on the probability distribution and also on its first derivative is a linear combination of the Boltzmann-Gibbs-Shannon and Fisher information measures. Further possibilities are investigated, too.


What is Shannon information?

…entity; it is always tied to a physical representation. It is represented by engraving on a stone tablet, a spin, a charge, a hole in a punched card, a mark on a paper, or some other equivalent.” (1996, p. 188; see also Landauer 1991). This view is also adopted by some philosophers of science; for instance, Peter Kosso states that “information is transferred between states through interaction.”…


Information theory, multivariate dependence, and genetic network inference

We define the concept of dependence among multiple variables using maximum entropy techniques and introduce a graphical notation to denote the dependencies. Direct inference of information theoretic quantities from data uncovers dependencies even in undersampled regimes when the joint probability distribution cannot be reliably estimated. The method is tested on synthetic data. We anticipate it...
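Direct inference of information-theoretic quantities from data, as described here, typically starts from a plug-in (maximum-likelihood) estimate built on empirical counts. A minimal sketch in plain Python (not the authors' method; the function name is illustrative, and the plug-in estimator is known to be biased in undersampled regimes):

```python
from collections import Counter
from math import log2

def plugin_mi(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                 # empirical joint counts
    px = Counter(x for x, _ in pairs)    # empirical marginal counts
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Deterministic relation Y = X over fair binary X: I(X;Y) = H(X) = 1 bit.
samples = [(0, 0), (1, 1)] * 50
print(plugin_mi(samples))  # 1.0
```

The abstract's point is that such estimates can reveal dependencies even when the full joint distribution is too undersampled to estimate reliably.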



Journal

Journal title: Entropy

Year: 2017

ISSN: 1099-4300

DOI: 10.3390/e19100531